7  The Transmission of Information

The problem faced by the decoder (inverter) is that although each transition gives

unambiguous information about the parameter value under which it occurred, the two

states involved did not exist at the same epoch; hence, one of the decoder’s inputs

must in effect behave now according to what the encoder’s output was. This problem

may be solved by introducing a delayer, represented by the transformation

    ↓   q   r   s
    Q   q   q   q
    R   r   r   r
    S   s   s   s                (7.3)

The encoder provides input (is joined) to the delayer and the decoder, and the delayer

provides an additional input (is joined) to the decoder (see the following example).
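The delayer of (7.3) can be pictured as a lookup table: under input Q the next state is q, under R it is r, under S it is s, whatever the present state. A minimal sketch (the Python names are ours, not the text's):

```python
# Transformation (7.3): the new state depends only on the input parameter
# (the row), never on the present state (the column).
DELAYER = {
    "Q": {"q": "q", "r": "q", "s": "q"},
    "R": {"q": "r", "r": "r", "s": "r"},
    "S": {"q": "s", "r": "s", "s": "s"},
}

def delay_step(state, inp):
    """One step of the delayer: the new state records only the input."""
    return DELAYER[inp][state]
```

Because every column of a row is identical, the delayer's state at the next epoch is simply a copy of the parameter now, which is exactly the one-step memory the decoder needs.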

Example. Consider a transducer (encoder) with the transformation n′ = n + a, where a is the input parameter and n is the variable.⁷ The inverting solution of the transducer’s equation is evidently a = n′ − n, but since n′ and n are not available simultaneously, a delayer is required. The delayer should have the transformation p′ = n, with n as the parameter and p as the variable. The inverter (decoder) has variable m and inputs n and p, and its transformation is m′ = n − p. The encoder’s input to the delayer and the decoder is n, and the delayer’s input to the decoder is its state p.

Problem. Start the transducer in the above example with n = 3 and verify the coding–decoding operation.
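The verification can be done mechanically. The sketch below assumes the three machines step simultaneously and that the delayer starts equal to the encoder’s initial state; it runs the encoder n′ = n + a, delayer p′ = n, and decoder m′ = n − p in lockstep. The decoder reproduces each input one step late, so one extra zero-input step flushes the last value:

```python
def run_coder_decoder(inputs, n0=3):
    """Encoder n' = n + a, delayer p' = n, decoder m' = n - p, stepped together."""
    n = p = n0                    # delayer assumed to start at the encoder's state
    decoded = []
    for a in list(inputs) + [0]:  # trailing zero step flushes the one-step lag
        # simultaneous update: each machine reads the others' *old* states
        n, p, m = n + a, n, n - p
        decoded.append(m)
    return decoded[1:]            # the first decoder output predates any input
```

With n = 3 and inputs 2, 5, −1 the decoded sequence is again 2, 5, −1: the inverter recovers the encoder’s parameter values, one epoch late.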

Problem. Attempt to find examples of decoders in living organisms.

7.4  Compression

Shannon’s fundamental theorem for a noiseless channel proves that it is possible to encode the output of an information source in such a way as to transmit at an average rate arbitrarily close to the channel capacity.

This is of considerable importance in telephony, which mostly deals with the

transmission of natural language. Shannon found by an empirical method that the

redundancy of the English language (due to syntactical constraint) is about 0.5.

Hence, by suitably encoding the output of an English-speaking source, the capacity

of a channel may be effectively doubled.
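Shannon’s figure of 0.5 can be pictured with the standard definitions (a sketch; the probabilities used below are illustrative, not taken from the text): the source entropy H = −Σ pᵢ log₂ pᵢ, compared against log₂ N for an N-symbol alphabet, gives the redundancy 1 − H / log₂ N.

```python
import math

def entropy(probs):
    """Shannon entropy in bits per symbol: H = -sum p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Fraction of capacity wasted if symbols are sent uncoded: 1 - H / log2(N)."""
    return 1 - entropy(probs) / math.log2(len(probs))
```

For instance, a four-symbol source with probabilities 1/2, 1/4, 1/8, 1/8 has H = 1.75 bits against log₂ 4 = 2, i.e. redundancy 0.125; a redundancy of 0.5, as Shannon estimated for English, means a suitable code halves the symbols that must be transmitted.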

This compression process is well illustrated by an example due to Shannon. Con-

sider a source producing a sequence of letters chosen from among A, B, C, and D. Our

first guess would be that the four symbols were being chosen with equal probabilities

of 1/4, and hence the average information rate per symbol would be log₂ 4 = 2 bits per

symbol. However, suppose that after a long delay we ascertain from the frequencies

⁷ Due to Ashby (1956).